Soft interaction model and the LHC data
Most models for soft interactions proposed prior to the measurements at the
LHC are only marginally compatible with the LHC data; our GLM model has the
same deficiency. In this paper we investigate possible causes of the problem
by considering separate fits to the high energy and low energy data. Our new
results are moderately higher than our previous predictions. Our results for
the total and elastic cross sections are systematically lower than the
recently published TOTEM and ALICE values, while our results for the
inelastic cross section and forward slope agree with the data. If, with
additional experimental data, the errors are reduced while the central cross
section values remain unchanged, we will need to reconsider the physics on
which our model is built.
Comment: 12 pp, 12 figures in .eps file
Specifying and Verifying Concurrent Algorithms with Histories and Subjectivity
We present a lightweight approach to Hoare-style specifications for
fine-grained concurrency, based on a notion of time-stamped histories that
abstractly capture atomic changes in the program state. Our key observation is
that histories form a partial commutative monoid, a structure fundamental for
representation of concurrent resources. This insight provides us with a
unifying mechanism that allows us to treat histories just like heaps in
separation logic. For example, both are subject to the same assertion logic and
inference rules (e.g., the frame rule). Moreover, the notion of ownership
transfer, which usually applies to heaps, has an equivalent in histories. It
can be used to formally represent helping---an important design pattern for
concurrent algorithms whereby one thread can execute code on behalf of another.
Specifications in terms of histories naturally abstract granularity, in the
sense that sophisticated fine-grained algorithms can be given the same
specifications as their simplified coarse-grained counterparts, making them
equally convenient for client-side reasoning. We illustrate our approach on a
number of examples and validate all of them in Coq.
Comment: 17 pages
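The key observation above, that histories form a partial commutative monoid just like heaps, can be sketched concretely. The following is a minimal illustration, assuming histories are represented as maps from timestamps to atomic changes; the `join` name, the stack-like entries, and the example values are illustrative choices, not the paper's Coq definitions:

```python
# Sketch: time-stamped histories as a partial commutative monoid (PCM),
# mirroring disjoint heap union in separation logic.

def join(h1, h2):
    """Join two histories (dicts: timestamp -> atomic change).
    Defined only when the timestamp domains are disjoint, like heap union."""
    if h1.keys() & h2.keys():
        return None  # undefined: overlapping timestamps
    merged = dict(h1)
    merged.update(h2)
    return merged

unit = {}  # the empty history is the monoid unit

# Commutativity and unit laws on a small example:
h1 = {1: ("push", 7)}
h2 = {2: ("pop", 7)}
assert join(h1, h2) == join(h2, h1) == {1: ("push", 7), 2: ("pop", 7)}
assert join(h1, unit) == h1
# Frame-rule flavour: a thread owning h1 may be framed with any disjoint h2,
# but joining a history with itself is undefined:
assert join(h1, h2) is not None and join(h1, h1) is None
```

Partiality of the join is exactly what lets the same assertion logic (including the frame rule) apply to histories and heaps alike.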
Exclusive double-diffractive production of open charm in proton-proton and proton-antiproton collisions
We calculate differential cross sections for exclusive double diffractive
(EDD) production of open charm in proton-proton and proton-antiproton
collisions. Sizeable cross sections are found. The EDD contribution constitutes
about 1 % of the total inclusive cross section for open charm production. A few
differential distributions are shown and discussed. The EDD contribution falls
faster with both the transverse momentum of the quark/antiquark and the
invariant mass than in the inclusive case.
Comment: 11 pages, 7 figures
Survival probability for exclusive central diffractive production of colorless states at the LHC
In this paper we discuss the survival probability for exclusive central
diffractive production of a colorless small size system at the LHC. This
process has a clear signature of two large rapidity gaps. Using the eikonal
approach for the description of soft interactions, we predict the value of the
survival probability to be about 5-6% for single channel models, while for a
two channel model the survival probability is about 3%. The dependence of the
survival probability factor (damping factor) on the transverse momenta of the
recoiled protons is discussed, and we suggest it be measured at the Tevatron so
as to minimize the possible ambiguity in the calculation of survival
probability at the LHC.
Comment: 33 pages, 26 figures
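The single-channel eikonal estimate mentioned above can be sketched numerically. The expression S^2 = ∫d^2b |M_H(b)|^2 exp(-Ω(b)) / ∫d^2b |M_H(b)|^2 is the standard single-channel form; the Gaussian profiles, slopes, and opacity used below are illustrative placeholders, not the paper's fitted parameters:

```python
import math

# Illustrative single-channel eikonal survival probability:
#   S^2 = Int d^2b |M_H(b)|^2 exp(-Omega(b)) / Int d^2b |M_H(b)|^2
# with Gaussian hard profile and soft opacity. All parameter values
# below are made-up placeholders for illustration only.

def survival(nu=2.0, B_soft=10.0, B_hard=4.0, bmax=15.0, n=3000):
    num = den = 0.0
    db = bmax / n
    for i in range(n):
        b = (i + 0.5) * db
        hard = math.exp(-b * b / B_hard)               # |M_H(b)|^2, Gaussian
        omega = nu * math.exp(-b * b / (2 * B_soft))   # soft opacity Omega(b)
        w = 2 * math.pi * b * db                       # d^2b measure
        num += w * hard * math.exp(-omega)
        den += w * hard
    return num / den

s2 = survival()
assert 0.0 < s2 < 1.0  # absorption suppresses, never enhances
```

The dependence on the impact-parameter profiles is what makes the survival factor sensitive to the transverse momenta of the recoiled protons, as the abstract notes.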
Unitarity Corrections to the Proton Structure Functions through the Dipole Picture
We study the dipole picture for the description of deep inelastic scattering,
focusing on the structure functions which are driven directly by the gluon
distribution. We perform estimates using the effective dipole cross section
given by the Glauber-Mueller approach in QCD, which encodes the corrections
due to the unitarity effects associated with the saturation phenomenon. We
also address issues of frame invariance of the calculations when analysing
the observables.
Comment: 16 pages, 8 figures. Version to be published in Phys. Rev.
Linearizability with Ownership Transfer
Linearizability is a commonly accepted notion of correctness for libraries of
concurrent algorithms. Unfortunately, it assumes a complete isolation between a
library and its client, with interactions limited to passing values of a given
data type. This is inappropriate for common programming languages, where
libraries and their clients can communicate via the heap, transferring the
ownership of data structures, and can even run in a shared address space
without any memory protection. In this paper, we present the first definition
of linearizability that lifts this limitation and establish an Abstraction
Theorem: while proving a property of a client of a concurrent library, we can
soundly replace the library by its abstract implementation related to the
original one by our generalisation of linearizability. This allows abstracting
from the details of the library implementation while reasoning about the
client. We also prove that linearizability with ownership transfer can be
derived from the classical one if the library does not access some of the
data structures transferred to it by the client.
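For contrast with the generalised notion above, classical linearizability (interaction limited to passing values, no ownership transfer) can be checked by brute force on tiny histories. The operation encoding and the sequential register specification below are illustrative assumptions, not the paper's formalism:

```python
from itertools import permutations

# Brute-force check of classical linearizability for a tiny concurrent
# history of a register. Each operation is (start, end, op, arg, result).

def respects_real_time(order, ops):
    # a must precede b in the linearization if a returned before b began
    pos = {o: i for i, o in enumerate(order)}
    return all(pos[a] < pos[b]
               for a in ops for b in ops
               if a[1] < b[0])

def legal_register(order):
    val = 0  # sequential spec: reads return the last written value
    for _, _, op, arg, res in order:
        if op == "write":
            val = arg
        elif res != val:
            return False
    return True

def linearizable(ops):
    return any(respects_real_time(p, ops) and legal_register(p)
               for p in permutations(ops))

# write(5) overlapping a read that returns 5: linearizable
h = [(0, 4, "write", 5, None), (2, 3, "read", None, 5)]
assert linearizable(h)
# a read returning 7 with no write(7): not linearizable
assert not linearizable([(0, 1, "read", None, 7)])
```

The ownership-transfer generalisation in the paper extends this values-only setting to histories in which operations may also move heap-allocated data between library and client.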
The survival probability of large rapidity gaps in a three channel model
The values and energy dependence of the survival probability of large
rapidity gaps (LRG) are calculated in a three channel model. This model
includes single and double diffractive production, as well as elastic
rescattering. It is shown that the survival probability decreases with
increasing energy, in line with recent results for LRG dijet production at
the Tevatron. This is in spite of the weak dependence on energy of the ratio.
Comment: 26 pages in latex file, 11 figures in eps file
Novel Mechanism of Nucleon Stopping in Heavy Ion Collisions
When a diquark does not fragment directly but breaks in such a way that only
one of its quarks gets into the produced baryon, the latter is produced closer
to mid rapidities. The relative size of this diquark breaking component
increases quite fast with increasing energy. We show that at a given energy
it also increases with the atomic mass number and with the centrality of the
collision, and that it allows one to explain the rapidity distribution of the
net baryon number in central collisions. Predictions for - collisions are
presented.
Comment: 10 pages, Latex file and 6 PostScript figures uuencoded in one file
Saturation Effects in Deep Inelastic Scattering at low and its Implications on Diffraction
We present a model based on the concept of saturation for small and
small . With only three parameters we achieve a good description of all Deep
Inelastic Scattering data below . This includes a consistent treatment
of charm and a successful extrapolation into the photoproduction regime. The
same model leads to a roughly constant ratio of diffractive and inclusive
cross sections.
Comment: 24 pages, 12 figures, Latex file
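A three-parameter saturation ansatz of the kind described above is often written in an eikonal-like form for the dipole cross section, with a saturation scale that grows as x decreases. The sketch below uses placeholder parameter values, not the published fit:

```python
import math

# Sketch of a saturation-model dipole cross section,
#   sigma(x, r) = sigma0 * (1 - exp(-r^2 * Qs^2(x) / 4)),
#   Qs^2(x)    = (x0 / x)^lam   [GeV^2],
# with three parameters (sigma0, x0, lam). The values below are
# illustrative placeholders, not the fitted ones.

sigma0, x0, lam = 23.0, 3e-4, 0.29  # mb, -, -  (illustrative)

def Qs2(x):
    return (x0 / x) ** lam

def sigma_dipole(x, r):
    return sigma0 * (1.0 - math.exp(-r * r * Qs2(x) / 4.0))

# The saturation scale grows as x decreases (higher energy) ...
assert Qs2(1e-5) > Qs2(1e-3)
# ... while the cross section for any dipole stays bounded by sigma0,
# which is how the model encodes unitarity (saturation).
assert sigma_dipole(1e-5, 5.0) < sigma0
```

Small dipoles see the linear (colour-transparent) limit sigma0·r²Qs²/4, while large dipoles saturate at sigma0; this taming of the growth is the saturation effect the abstract refers to.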
Tighter Relations Between Sensitivity and Other Complexity Measures
The sensitivity conjecture is a longstanding and fundamental open problem in the
area of complexity measures of Boolean functions and decision tree complexity.
The conjecture postulates that the maximum sensitivity of a Boolean function is
polynomially related to other major complexity measures. Despite much attention
to the problem and major advances in analysis of Boolean functions in the past
decade, the problem remains wide open with no positive result toward the
conjecture since the work of Kenyon and Kutin from 2004.
In this work, we present new upper bounds for various complexity measures in
terms of sensitivity, improving the bounds provided by Kenyon and Kutin.
Specifically, we show that deg(f)^{1-o(1)}=O(2^{s(f)}) and C(f) < 2^{s(f)-1}
s(f); these in turn imply various corollaries regarding the relation between
sensitivity and other complexity measures, such as block sensitivity, via known
results. The gap between sensitivity and other complexity measures remains
exponential but these results are the first improvement for this difficult
problem that has been achieved in a decade.Comment: This is the merged form of arXiv submission 1306.4466 with another
work. Appeared in ICALP 2014, 14 page
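The complexity measures compared above can be computed by brute force on small Boolean functions, which makes the stated bound C(f) < 2^{s(f)-1} s(f) easy to check on examples. The choice of function (OR on 3 bits) is ours, for illustration:

```python
from itertools import product

# Brute-force sensitivity s(f) and certificate complexity C(f) of a small
# Boolean function, checking the bound C(f) < 2^{s(f)-1} s(f) on an example.

def sensitivity(f, n):
    best = 0
    for x in product([0, 1], repeat=n):
        # count positions whose flip changes f's value at x
        s = sum(f(x) != f(x[:i] + (1 - x[i],) + x[i + 1:]) for i in range(n))
        best = max(best, s)
    return best

def certificate_complexity(f, n):
    best = 0
    for x in product([0, 1], repeat=n):
        cx = n  # smallest set of positions of x whose values force f(x)
        for mask in range(1 << n):
            fixed = [i for i in range(n) if mask >> i & 1]
            if all(f(y) == f(x)
                   for y in product([0, 1], repeat=n)
                   if all(y[i] == x[i] for i in fixed)):
                cx = min(cx, len(fixed))
        best = max(best, cx)
    return best

f = lambda x: int(any(x))  # OR on 3 bits
n = 3
s, C = sensitivity(f, n), certificate_complexity(f, n)
assert (s, C) == (3, 3)     # for OR_n both measures equal n
assert C < 2 ** (s - 1) * s  # 3 < 12: the abstract's bound holds here
```

For OR_n the two measures coincide, so the bound is far from tight on this example; the hard cases driving the conjecture are functions where sensitivity is much smaller than the other measures.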